Risk minimization, probability elicitation, and cost-sensitive SVMs
Authors
Abstract
A new procedure for learning cost-sensitive SVM classifiers is proposed. The SVM hinge loss is extended to the cost-sensitive setting, and the cost-sensitive SVM is derived as the minimizer of the associated risk. The extension of the hinge loss draws on recent connections between risk minimization and probability elicitation. These connections are generalized to cost-sensitive classification in a manner that guarantees consistency with the cost-sensitive Bayes risk and the associated Bayes decision rule. This ensures that optimal decision rules under the new hinge loss implement the Bayes-optimal cost-sensitive classification boundary. Minimization of the new hinge loss is shown to be a generalization of the classic SVM optimization problem and can be solved by identical procedures. The resulting algorithm avoids the shortcomings of previous approaches to cost-sensitive SVM design and has superior experimental performance.
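The abstract does not reproduce the proposed cost-sensitive hinge loss itself. As a rough illustration of the general idea only, the sketch below uses the common class-weighted hinge-loss surrogate available through scikit-learn's class_weight parameter; the cost values and synthetic dataset are hypothetical, and this is not the paper's exact formulation.

```python
# Minimal sketch (assumption: class-weighted hinge loss as a stand-in for the
# paper's CS-SVM loss). Weighting each class's hinge loss by its
# misclassification cost shifts the learned boundary toward the
# cost-sensitive Bayes rule, i.e. thresholding the posterior at
# C_fp / (C_fp + C_fn) instead of 0.5.
from sklearn.svm import LinearSVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Illustrative costs: a false negative (missing class 1) is assumed to be
# five times as costly as a false positive.
C_fn, C_fp = 5.0, 1.0

# Imbalanced synthetic data: class 1 is the rare, costly-to-miss class.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear SVM whose hinge loss is re-weighted per class by the costs above.
clf = LinearSVC(class_weight={0: C_fp, 1: C_fn}, C=1.0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

Relative to a plain SVM, the weighted variant trades some overall accuracy for fewer of the expensive errors; the paper's contribution is a hinge-loss extension for which this trade-off is provably consistent with the cost-sensitive Bayes rule.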
Similar articles
Cost-Sensitive Support Vector Machines
A new procedure for learning cost-sensitive SVM (CS-SVM) classifiers is proposed. The SVM hinge loss is extended to the cost-sensitive setting, and the CS-SVM is derived as the minimizer of the associated risk. The extension of the hinge loss draws on recent connections between risk minimization and probability elicitation. These connections are generalized to cost-sensitive classification, in a...
A particle swarm optimization algorithm for minimization analysis of cost-sensitive attack graphs
To prevent an exploit, the security analyst must implement a suitable countermeasure. In this paper, we consider cost-sensitive attack graphs (CAGs) for network vulnerability analysis. In these attack graphs, a weight is assigned to each countermeasure to represent the cost of its implementation. There may be multiple countermeasures with different weights for preventing a single exploit. Also,...
A comparative analysis of structural risk minimization by support vector machines and nearest neighbor rule
Support Vector Machines (SVMs) are by far the most sophisticated and powerful classifiers available today. However, this robustness and novelty in approach come at a large computational cost. On the other hand, Nearest Neighbor classifiers provide a simple yet robust approach that is guaranteed to converge to a result. In this paper, we present a technique that combines these two classifiers by...
A Nearest Neighbor Classifier based on Structural Risk Minimization
Support Vector Machines are by far the most sophisticated and powerful classifiers available today. However, this robustness and novelty in approach come at a large computational cost. On the other hand, Nearest Neighbor classifiers provide a simple yet robust approach that is guaranteed to converge to a result. In this paper, we present a technique that combines these two classifiers by adopti...
On Consistent Surrogate Risk Minimization and Property Elicitation
Surrogate risk minimization is a popular framework for supervised learning; property elicitation is a widely studied area in probability forecasting, machine learning, statistics and economics. In this paper, we connect these two themes by showing that calibrated surrogate losses in supervised learning can essentially be viewed as eliciting or estimating certain properties of the underlying con...
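The connection between calibrated surrogates and property elicitation can be made concrete with a standard textbook example that is not taken from the cited paper: the logistic surrogate is classification-calibrated, and its pointwise minimizer is the logit of the class posterior, so minimizing the surrogate implicitly estimates (elicits) η(x) = P(Y=1 | X=x).

```latex
% Standard illustration (assumption: logistic loss as the calibrated
% surrogate; not the specific construction of the cited paper).
% Pointwise conditional risk of the logistic loss and its minimizer:
\[
  f^{*}(x)
  \;=\; \arg\min_{f}\;
  \eta(x)\,\log\!\bigl(1+e^{-f}\bigr)
  \;+\; \bigl(1-\eta(x)\bigr)\log\!\bigl(1+e^{f}\bigr)
  \;=\; \log\frac{\eta(x)}{1-\eta(x)},
\]
% so the conditional probability is recovered through the inverse link:
\[
  \eta(x) \;=\; \frac{1}{1+e^{-f^{*}(x)}}.
\]
```

In this sense the surrogate "elicits" a property (here, the full conditional probability) of the underlying distribution; the cost-sensitive extension in the main paper generalizes this correspondence so that the elicited quantity is compared against a cost-dependent threshold rather than 1/2.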